
California Assembly Weighs Nation’s Broadest AI-Driven Workplace Surveillance Bill: AB 1221 Raises the Bar, and the Stakes, for Employers

By Mari Clifford and Scott Hall 

In a move that could reshape day-to-day people-management practices across the state, the California Legislature is advancing Assembly Bill 1221 (“AB 1221”), a sweeping proposal that would regulate how employers deploy artificial intelligence-enabled monitoring tools and how they handle the torrents of data those tools generate. After clearing two policy committees, the measure was placed on the Assembly Appropriations Committee’s “suspense” file on May 14, 2025, a key fiscal hurdle on the path to a possible floor vote. AB 1221’s fiscal impact will be scrutinized in the Appropriations Committee, and the bill could still be amended, perhaps to narrow its scope or clarify open questions such as what constitutes a “significant update” to an existing tool. Nonetheless, the measure enjoys strong labor support and dovetails with California’s broader push to regulate AI. Even if AB 1221 stalls, its core concepts are likely to resurface.

What AB 1221 Would Require

The bill defines a “workplace surveillance tool” broadly to include virtually any technology that actively or passively captures worker data, from innocuous time-tracking widgets to sophisticated photo-optical systems. It would obligate employers (public and private, large and small, as well as their labor-contractor intermediaries) to furnish plain-language written notice at least thirty days before launching any such tool. That notice must spell out the categories of data collected, the business purpose, the frequency and duration of monitoring, retention periods, vendor identities, the extent to which the data informs employment decisions, and the process by which workers may access or correct that data.

Once a surveillance system is up and running, it may collect, use, and retain only information that is “reasonably necessary and proportionate” to the purpose identified in the notice, and employers bear joint liability for security breaches involving worker data. Contracts with analytics providers must therefore incorporate robust cybersecurity safeguards, cooperation duties and deletion obligations. Vendors must return worker data “in a user-friendly format” at contract end and delete any remaining copies.

AB 1221 would prohibit facial recognition, gait analysis, emotion detection and neural-data collection, but with one narrow carve-out: facial recognition may still be used solely to unlock a device or grant access to a locked or secured area. The bill also bars employers from using surveillance to infer protected traits such as immigration status, health or reproductive history, religion, sexual orientation, disability, criminal record or credit history.

Employers may not rely primarily on monitoring data when disciplining or terminating a worker. If they choose to factor that data into such a decision, a human reviewer must corroborate it. The employer must notify the worker of the decision, provide a simple request form, and give the worker five business days to ask for the surveillance and corroborating records. Any valid correction must be made, and the personnel action adjusted, within twenty-four hours. Records that play any role in discipline must be retained for five years.

Enforcement Mechanisms and Civil Exposure

AB 1221 would vest enforcement authority in the Labor Commissioner, impose civil penalties of $500 per violation, and create a private right of action that includes actual and punitive damages as well as attorneys’ fees. Public prosecutors could also bring suit, and plaintiffs could seek injunctive relief, heightening litigation leverage for worker-side counsel.

Points of Contention and Legislative Headwinds

Industry groups, including the California chapter of SHRM, have criticized the proposal’s breadth, warning that it could hamper legitimate safety and operational uses of technology and saddle businesses with ambiguous compliance obligations. Labor advocates counter that AB 1221 supplies essential guardrails against what they describe as an exploding “digital Taylorism” that erodes privacy and exacerbates bias.

Practical Implications for Employers

If enacted, the bill would force employers to inventory every monitoring technology—no matter how routine—and to recalibrate vendor contracts, internal policies and disciplinary protocols. Multistate employers that already comply with New York City’s automated-employment-decision rules or the EU’s AI Act would confront new obligations around thirty-day advance notice, categorical technology bans and accelerated employee-data-access timelines. Because the measure’s private right of action is untethered to data-breach harm, plaintiffs’ lawyers would gain a fresh litigation hook wherever monitoring intersects with hiring, promotion or termination decisions.

Takeaways

Employers should begin mapping every data stream generated by workplace technologies, updating privacy notices and embedding human review into any algorithmically informed employment decision. Whether AB 1221 becomes law this session or next, the legislative trajectory is clear: AI-powered surveillance is migrating from operational convenience to regulated activity, and businesses that fail to get ahead of these requirements risk both regulatory penalties and private lawsuits.